A novel family of non-parametric cumulative based divergences for point processes
Authors
Abstract
Hypothesis testing on point processes has several applications, such as model fitting, plasticity detection, and non-stationarity detection. Standard tools for hypothesis testing include tests on the mean firing rate and the time-varying rate function. However, these statistics do not fully characterize a point process, and the conclusions drawn from such tests can therefore be misleading. In this paper, we introduce a family of non-parametric divergence measures for hypothesis testing. A divergence measure compares the full probability structure and therefore yields a more robust hypothesis test. We extend the traditional Kolmogorov–Smirnov and Cramér–von Mises tests to the space of spike trains via stratification, and show that these statistics can be consistently estimated from data without any free parameters. We also demonstrate an application of the proposed divergences as a cost function for finding optimally matched point processes.
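The stratified extension to spike-train space is beyond a short snippet, but the classical two-sample Kolmogorov–Smirnov statistic on which it builds can be illustrated directly on spike times. The sketch below is only illustrative: it simulates two hypothetical homogeneous Poisson spike trains (the rates and the `poisson_spike_train` helper are assumptions, not from the paper) and compares the empirical CDFs of their spike times with SciPy's `ks_2samp`.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

def poisson_spike_train(rate_hz, duration_s, rng):
    """Simulate a homogeneous Poisson spike train by cumulating
    exponential inter-spike intervals (hypothetical helper)."""
    isis = rng.exponential(1.0 / rate_hz, size=int(rate_hz * duration_s * 2))
    spikes = np.cumsum(isis)
    return spikes[spikes < duration_s]

spikes_a = poisson_spike_train(20.0, 5.0, rng)  # assumed 20 Hz train
spikes_b = poisson_spike_train(25.0, 5.0, rng)  # assumed 25 Hz train

# Two-sample KS statistic: sup-distance between the empirical CDFs
# of the spike times of the two trains.
ks_stat, p_value = ks_2samp(spikes_a, spikes_b)
```

Note that this pools all spike times into a single empirical CDF per train; the paper's stratification instead conditions on the number of spikes before comparing distributions.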
Similar resources
Pattern Learning and Recognition on Statistical Manifolds: An Information-Geometric Review
We review the information-geometric framework for statistical pattern recognition. First, we explain the role of statistical similarity measures and distances in fundamental statistical pattern recognition problems. We then concisely review the main statistical distances and report a novel, versatile family of divergences. Depending on their intrinsic complexity, the statistical patterns are lea...
A robust wavelet based profile monitoring and change point detection using S-estimator and clustering
Some quality characteristics are well defined when treated as response variables and are related to some independent variables. This relationship is called a profile. Parametric models, such as linear models, may be used to model profiles. However, in practical applications, due to the complexity of many processes, it is usually not possible to model a process using parametric models. In these cas...
Testing statistical hypotheses based on the density power divergence
The family of density power divergences is a useful class that generates robust parameter estimates with high efficiency. None of these divergences requires any non-parametric density estimate to carry out the inference procedure. However, these divergences have so far not been used effectively in robust testing of hypotheses. In this paper, we develop tests of hypotheses based on this family ...
A family of statistical symmetric divergences based on Jensen's inequality
We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen’s inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimum average d...
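As a concrete special case of this family, the Jensen–Shannon divergence (the member obtained with the Shannon entropy generator) can be sketched for discrete distributions. The function name and the example distributions below are illustrative, not taken from the paper.

```python
import numpy as np

def jensen_shannon(p, q):
    """Jensen-Shannon divergence (natural log) between two discrete
    probability vectors p and q of the same length."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture distribution

    def kl(a, b):
        # Kullback-Leibler divergence, skipping zero-probability entries.
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5]
q = [0.9, 0.1]
d = jensen_shannon(p, q)  # symmetric, bounded by log(2)
```

Unlike the Jeffreys divergence, this quantity stays finite even when one distribution assigns zero probability where the other does not, because both are compared against their mixture.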
The power divergence and the density power divergence families: the mathematical connection
The power divergence family of Cressie and Read (1984) is a highly popular family of density-based divergences which is widely used in robust parametric estimation and multinomial goodness-of-fit testing. This family forms a subclass of the family of φ-divergences (Csiszár, 1963; Pardo, 2006) or disparities (Lindsay, 1994). The more recently described family of density power divergences (Basu e...